pairwise independence
In probability theory, a pairwise independent collection of random variables is a set of random variables any two of which are independent.〔Gut, A. (2005) ''Probability: a Graduate Course'', Springer-Verlag. ISBN 0-387-27332-8. pp. 71–72.〕 Any collection of mutually independent random variables is pairwise independent, but some pairwise independent collections are not mutually independent. Pairwise independent random variables with finite variance are uncorrelated.
A pair of random variables ''X'' and ''Y'' are independent if and only if the random vector (''X'', ''Y'') with joint cumulative distribution function (CDF) F_{X,Y}(x,y) satisfies
:F_{X,Y}(x,y) = F_X(x) F_Y(y),
or equivalently, their joint density f_{X,Y}(x,y) satisfies
:f_{X,Y}(x,y) = f_X(x) f_Y(y).
That is, the joint distribution is equal to the product of the marginal distributions.〔 Definition 2.5.1, page 109.〕
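For discrete distributions, this defining factorization can be verified directly by enumeration. Below is a minimal Python sketch of such a check; the function is_independent and its tolerance argument are illustrative choices, not part of any standard library:

 from itertools import product
 
 def is_independent(joint, tol=1e-12):
     # `joint` maps (x, y) pairs to probabilities.
     fx, fy = {}, {}
     for (x, y), p in joint.items():
         fx[x] = fx.get(x, 0.0) + p  # marginal distribution of X
         fy[y] = fy.get(y, 0.0) + p  # marginal distribution of Y
     # Independence: f(x, y) = f_X(x) * f_Y(y) for every pair (x, y).
     return all(abs(joint.get((x, y), 0.0) - fx[x] * fy[y]) < tol
                for x, y in product(fx, fy))
 
 # Two independent tosses of a fair coin: each joint probability is 1/4.
 coins = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
 print(is_independent(coins))  # True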
Unless the context makes clear otherwise, in practice the modifier "mutual" is usually dropped, so that ''independence'' means ''mutual independence''. A statement such as "''X'', ''Y'', ''Z'' are independent random variables" means that ''X'', ''Y'', ''Z'' are mutually independent.
==Example==

Pairwise independence does not imply mutual independence, as shown by the following example attributed to S. Bernstein.〔 Remark 2.6.1, p. 120.〕
Suppose ''X'' and ''Y'' are two independent tosses of a fair coin, where we designate 1 for heads and 0 for tails. Let the third random variable ''Z'' be equal to 1 if exactly one of those coin tosses resulted in "heads", and 0 otherwise. Then jointly the triple (''X'', ''Y'', ''Z'') has the following probability distribution:
:(X,Y,Z)=\begin{cases}
(0,0,0) & \text{with probability}\ 1/4, \\
(0,1,1) & \text{with probability}\ 1/4, \\
(1,0,1) & \text{with probability}\ 1/4, \\
(1,1,0) & \text{with probability}\ 1/4.
\end{cases}
Here the marginal probability distributions are identical: f_X(0)=f_Y(0)=f_Z(0)=1/2 and
f_X(1)=f_Y(1)=f_Z(1)=1/2. The bivariate distributions also agree: f_{X,Y}=f_{X,Z}=f_{Y,Z}, where f_{X,Y}(0,0)=f_{X,Y}(0,1)=f_{X,Y}(1,0)=f_{X,Y}(1,1)=1/4.
Since each of the pairwise joint distributions equals the product of their respective marginal distributions, the variables are pairwise independent:
* ''X'' and ''Y'' are independent, and
* ''X'' and ''Z'' are independent, and
* ''Y'' and ''Z'' are independent.
However, ''X'', ''Y'', and ''Z'' are not mutually independent, since f_{X,Y,Z}(x,y,z) \neq f_X(x)f_Y(y)f_Z(z); for instance, f_{X,Y,Z}(0,0,0) = 1/4, whereas f_X(0)f_Y(0)f_Z(0) = 1/8. Indeed, any one of ''X'', ''Y'', ''Z'' is completely determined by the other two (each is the sum, modulo 2, of the other two). That is as far from independence as random variables can get.
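To make the example concrete, here is a short Python sketch (in the same illustrative style as the checker above) that enumerates Bernstein's distribution and confirms that every bivariate marginal factors while the trivariate distribution does not:

 from itertools import product
 
 # The four equally likely outcomes of (X, Y, Z) in Bernstein's example.
 outcomes = {(0, 0, 0): 0.25, (0, 1, 1): 0.25,
             (1, 0, 1): 0.25, (1, 1, 0): 0.25}
 
 def marginal(axes):
     # Sum the joint pmf over the coordinates not listed in `axes`.
     m = {}
     for triple, p in outcomes.items():
         key = tuple(triple[a] for a in axes)
         m[key] = m.get(key, 0.0) + p
     return m
 
 single = [marginal((i,)) for i in range(3)]  # f_X, f_Y, f_Z
 # Pairwise independence: each bivariate pmf equals the product of marginals.
 for i, j in [(0, 1), (0, 2), (1, 2)]:
     pair = marginal((i, j))
     assert all(abs(pair.get((a, b), 0.0)
                    - single[i][(a,)] * single[j][(b,)]) < 1e-12
                for a, b in product((0, 1), repeat=2))
 # Mutual independence fails: f(0,0,0) = 1/4, product of marginals is 1/8.
 print(outcomes[(0, 0, 0)],
       single[0][(0,)] * single[1][(0,)] * single[2][(0,)])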
